CS168: The Modern Algorithmic Toolbox Lecture #18: Linear and Convex Programming, with Applications to Sparse Recovery
Abstract
Recall the setup in compressive sensing. There is an unknown signal z ∈ R^n, and we can only glean information about z through linear measurements. We choose m linear measurements a_1, . . . , a_m ∈ R^n. "Nature" then chooses a signal z, and we receive the results b_1 = ⟨a_1, z⟩, . . . , b_m = ⟨a_m, z⟩ of our measurements when applied to z. The goal is then to recover z from b. Last lecture culminated in the following sparse recovery guarantee for compressive sensing.
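To make the setup concrete, here is a minimal sketch (not code from the lecture) of sparse recovery by L1-minimization, cast as a linear program in the standard way and solved with scipy.optimize.linprog. The matrix A stacks the measurement vectors a_1, . . . , a_m as rows, so b = Az; the function name l1_recover and the toy problem sizes below are illustrative choices, not part of the original notes.

```python
# Sketch: recover a sparse z from b = A z by solving  min ||x||_1  s.t.  A x = b,
# reformulated as an LP with auxiliary variables t >= |x| coordinate-wise.
import numpy as np
from scipy.optimize import linprog

def l1_recover(A, b):
    """Solve min ||x||_1 subject to A x = b via linear programming."""
    m, n = A.shape
    # Decision variables: [x (n entries), t (n entries)]; minimize sum(t).
    c = np.concatenate([np.zeros(n), np.ones(n)])
    I = np.eye(n)
    # Encode  x - t <= 0  and  -x - t <= 0, i.e. |x_i| <= t_i for every i.
    A_ub = np.block([[I, -I], [-I, -I]])
    b_ub = np.zeros(2 * n)
    # Encode the measurement constraints  A x = b  (t does not appear).
    A_eq = np.hstack([A, np.zeros((m, n))])
    bounds = [(None, None)] * n + [(0, None)] * n  # x free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
                  bounds=bounds, method="highs")
    return res.x[:n]

# Toy usage: a 10-sparse signal in R^200, measured with 80 random Gaussian vectors.
rng = np.random.default_rng(0)
n, m, k = 200, 80, 10
z = np.zeros(n)
z[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))
b = A @ z
z_hat = l1_recover(A, b)
print("recovery error:", np.linalg.norm(z_hat - z))
```

The LP reformulation is the standard one: minimizing the sum of the t_i under |x_i| ≤ t_i is equivalent to minimizing the L1 norm of x, which is the convex surrogate for sparsity discussed in this lecture.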